# Multilingual pretraining fine-tuning

## Mbart Large 50 Finetuned Ar Wikilingua
Text summarization model fine-tuned on the Arabic wiki_lingua dataset, based on mbart-large-50.
Text Generation · Transformers · ahmeddbahaa · 21 · 0
## Mbart Large Cc25 Cnn Dailymail Nl
A Dutch news summarization model fine-tuned on the MBART architecture, developed by ml6team.
Text Generation · Transformers · Other · ml6team · 75 · 6
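The summarization models above are distributed through the Hugging Face Hub, so they can be loaded with the standard `transformers` pipeline API. A minimal sketch for the Dutch summarizer follows; the model id `ml6team/mbart-large-cc25-cnn-dailymail-nl` is an assumption inferred from the listing name and should be verified on the Hub before use.

```python
# Hedged sketch: running the ml6team Dutch summarizer via the transformers
# summarization pipeline. The model id is assumed from the listing above.
MODEL_ID = "ml6team/mbart-large-cc25-cnn-dailymail-nl"


def summarize(text: str, max_length: int = 142) -> str:
    """Summarize a Dutch news article; downloads the model on first call."""
    # Import inside the function so the module loads even without transformers.
    from transformers import pipeline

    summarizer = pipeline("summarization", model=MODEL_ID)
    result = summarizer(text, max_length=max_length, truncation=True)
    return result[0]["summary_text"]


if __name__ == "__main__":
    article = "..."  # a Dutch news article goes here
    print(summarize(article))
```

The same pattern applies to the Arabic wiki_lingua model by swapping in its model id.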
## Vakyansh Wav2vec2 Punjabi Pam 10
License: MIT
A Punjabi automatic speech recognition model fine-tuned from the CLSRIL-23 multilingual pretrained model; accepts 16 kHz sampled speech input.
Speech Recognition · Transformers · Other · Harveenchadha · 96 · 0
## Vakyansh Wav2vec2 Tamil Tam 250
License: MIT
A Tamil automatic speech recognition model based on the Wav2Vec2 architecture, developed by Harveen Chadha and fine-tuned on 4,200 hours of Hindi data.
Speech Recognition · Transformers · Other · Harveenchadha · 1,843 · 2
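The Vakyansh ASR models above are Wav2Vec2 CTC checkpoints, so transcription follows the usual processor-plus-model pattern in `transformers`. A minimal sketch for the Punjabi model is below; the model id `Harveenchadha/vakyansh-wav2vec2-punjabi-pam-10` is assumed from the listing name, and the 16 kHz requirement is taken from the model description above.

```python
# Hedged sketch: transcribing a 16 kHz mono WAV file with the Vakyansh
# Punjabi Wav2Vec2 model. Model id assumed from the listing; requires
# torch, transformers, and soundfile to be installed.
def transcribe(wav_path: str) -> str:
    """Return the transcript of a 16 kHz mono WAV file."""
    # Imports live inside the function so the module loads without the deps.
    import torch
    import soundfile as sf
    from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

    model_id = "Harveenchadha/vakyansh-wav2vec2-punjabi-pam-10"
    processor = Wav2Vec2Processor.from_pretrained(model_id)
    model = Wav2Vec2ForCTC.from_pretrained(model_id)

    speech, rate = sf.read(wav_path)
    if rate != 16000:
        raise ValueError("model expects 16 kHz sampled input")

    inputs = processor(speech, sampling_rate=16000, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    predicted_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(predicted_ids)[0]
```

Swapping the model id for the Tamil checkpoint yields Tamil transcription with the same code.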
© 2025 AIbase